AAAI AI-Alert for Feb 26, 2019
AI Plays Games
Artificial Intelligence (AI) was all the rage in the 1980s. Specifically, companies invested heavily to build expert systems: AI applications that captured the knowledge of acknowledged human experts and made it available to solve narrowly defined types of problems. Thus, expert systems were created to configure complex computer systems and to detect likely credit card fraud. This earlier round of AI was triggered by a series of successful academic expert applications created at Stanford University. Dendral analyzed mass spectrometry data and identified organic molecules, something that, previously, only a few chemists could do. Another expert system, called Mycin, analyzed potential meningitis infections. In a series of tests, Mycin was shown to diagnose meningitis as well as human meningitis experts, and it even did slightly better, since it never overlooked possible drug incompatibility issues. The expert systems developed in the Eighties all followed the general approach pioneered by Dendral and Mycin.
Are you being scanned? How facial recognition technology follows you, even as you shop
If you shop at Westfield, you've probably been scanned and recorded by dozens of hidden cameras built into the centres' digital advertising billboards. The semi-camouflaged cameras can determine not only your age and gender but your mood, cueing up tailored advertisements within seconds, thanks to facial detection technology. Westfield's Smartscreen network was developed by the French software firm Quividi back in 2015. Their discreet cameras capture blurry images of shoppers and apply statistical analysis to identify audience demographics. And once the billboards have your attention they hit record, sharing your reaction with advertisers.
Bionic Hands Let Amputees Feel and Grip
If you're sitting near a coffee mug, pick it up, and note how easy it is to do without really looking. You feel the curvature of the handle, the width of the cup, the slipperiness of the ceramic. Your hand glides into place and you squeeze, getting a sense of the weight, and bring the cup to your mouth. Now, imagine trying to do that with a robotic hand that gives you no sensory feedback. You get no information about the tiny adjustments that your fingers must make in order to grasp it properly.
When Is Technology Too Dangerous to Release to the Public?
Last week, the nonprofit research group OpenAI revealed that it had developed a new text-generation model that can write coherent, versatile prose given a certain subject matter prompt. However, the organization said, it would not be releasing the full algorithm due to "safety and security concerns." Instead, OpenAI decided to release a "much smaller" version of the model and withhold the data sets and training code that were used to develop it. If your knowledge of the model, called GPT-2, came solely from the headlines of the resulting news coverage, you might think that OpenAI had built a weapons-grade chatbot. A headline from Metro U.K. read, "Elon Musk-Founded OpenAI Builds Artificial Intelligence So Powerful That It Must Be Kept Locked Up for the Good of Humanity."
A philosopher argues that an AI can never be an artist
On March 31, 1913, in the Great Hall of the Musikverein concert house in Vienna, a riot broke out in the middle of a performance of an orchestral song by Alban Berg. Police arrested the concert's organizer for punching Oscar Straus, a little-remembered composer of operettas. Later, at the trial, Straus quipped about the audience's frustration. The punch, he insisted, was the most harmonious sound of the entire evening. History has rendered a different verdict: the concert's conductor, Arnold Schoenberg, has gone down as perhaps the most creative and influential composer of the 20th century. You may not enjoy Schoenberg's dissonant music, which rejects conventional tonality to arrange the 12 notes of the scale according to rules that don't let any predominate. But he changed what humans understand music to be. This is what makes him a genuinely creative and innovative artist.
How Self-Driving Cars Might Transform City Parking
Autonomous vehicles could transform parking as well as driving, new research suggests. Parking lots could house more driverless cars than human-driven ones, but autonomous vehicles could also lead to nightmarish gridlock if they slowly cruise the streets waiting for their owners, instead of paying to park. The typical vehicle spends 95 percent of its lifetime parked. The need to store parked vehicles has turned a lot of potentially valuable real estate into parking garages; for example, in the United States, roughly 6,500 square miles of land is devoted to parking, which is larger than the entire state of Connecticut. Autonomous vehicles could, in principle, transform parking lots.
Call to Ban Killer Robots in Wars
A scientific coalition is urging a ban on the development of weapons governed by artificial intelligence (AI), warning they may malfunction unpredictably and kill innocent people. The coalition has established the Campaign to Stop Killer Robots to lobby for an international accord. Human Rights Watch's Mary Wareham said autonomous weapons "are beginning to creep in. Drones are the obvious example, but there are also military aircraft that take off, fly, and land on their own; robotic sentries that can identify movement."
Drone no-fly zone to be widened after Gatwick chaos
The no-fly zone for drones around airports is to be extended following the disruption at Gatwick in December, the government says. From 13 March it will be illegal to fly a drone within three miles of an airport, rather than the current 0.6-mile (1km) exclusion zone. The government also said it wants police to have new stop and search powers to tackle drone misuse. Gatwick was shut for more than a day after drone sightings near the runway. It caused chaos for travellers, affecting more than 1,000 flights and about 140,000 passengers.
Machine Learning Is Contributing to the Reproducibility Crisis in Science
Machine-learning techniques used by thousands of scientists to analyze data are contributing to the reproducibility crisis in science by producing results that are misleading and often wrong, says Genevera Allen of Rice University. She warns that if scientists don't improve their techniques, they will be wasting both time and money. A growing amount of scientific research involves using machine-learning software to analyze data that has already been collected. Allen says the answers such software comes up with are likely to be inaccurate or wrong, because the software is identifying patterns that exist only in that data set and not in the real world.
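The failure mode Allen describes can be illustrated with a minimal, hypothetical sketch (not from the article): if a program searches thousands of candidate features for the one most correlated with an outcome, it will find an apparently strong pattern even in pure noise, and that pattern then vanishes on fresh data drawn from the same process.

```python
import numpy as np

rng = np.random.default_rng(0)

# Purely random data: 100 samples, 2000 candidate features, and an
# outcome that has NO real relationship to any of the features.
n_samples, n_features = 100, 2000
X = rng.normal(size=(n_samples, n_features))
y = rng.normal(size=n_samples)

# Data-driven discovery: pick the feature most correlated with the outcome.
corrs = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_features)])
best = int(np.argmax(np.abs(corrs)))
in_sample = abs(corrs[best])
print(f"best in-sample |correlation|: {in_sample:.2f}")

# Replication attempt: the same feature, on fresh data from the same
# (null) process. The "pattern" does not carry over.
X_new = rng.normal(size=(n_samples, n_features))
y_new = rng.normal(size=n_samples)
replicated = abs(np.corrcoef(X_new[:, best], y_new)[0, 1])
print(f"same feature on new data:     {replicated:.2f}")
```

The in-sample correlation looks convincing only because it was selected from thousands of chances; on new data it collapses toward zero, which is exactly the "patterns that exist only in that data set" problem the article describes.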